Estimation of KL Divergence: Optimal Minimax Rate

Authors
Abstract


Similar Articles

Minimax Estimation of KL Divergence between Discrete Distributions

We refine the general methodology in [1] for the construction and analysis of essentially minimax estimators for a wide class of functionals of finite dimensional parameters, and elaborate on the case of discrete distributions with support size S comparable with the number of observations n. Specifically, we determine the “smooth” and “non-smooth” regimes based on the confidence set and the smo...
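
The natural baseline that such minimax constructions improve upon is the plug-in (empirical-frequency) estimator. The following is a minimal sketch of that baseline only, not the estimator analyzed in the paper; the function name plugin_kl, the smoothing floor, and the example distributions are illustrative assumptions.

```python
import numpy as np

def plugin_kl(x_samples, y_samples, support_size, floor=1e-12):
    """Naive plug-in estimate of D(P || Q) from i.i.d. samples of P and Q.

    Empirical frequencies stand in for the true distributions; `floor` keeps
    log(p/q) finite when a symbol never appears in y_samples.
    """
    p_hat = np.bincount(x_samples, minlength=support_size) / len(x_samples)
    q_hat = np.bincount(y_samples, minlength=support_size) / len(y_samples)
    q_hat = np.maximum(q_hat, floor)        # avoid division by zero
    mask = p_hat > 0                        # convention: 0 * log(0 / q) = 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))

# Toy example on a support of size S = 5; the true D(P || Q) is about 0.26 nats.
rng = np.random.default_rng(0)
P = np.array([0.4, 0.3, 0.2, 0.05, 0.05])
Q = np.full(5, 0.2)
x = rng.choice(5, size=10_000, p=P)
y = rng.choice(5, size=10_000, p=Q)
print(plugin_kl(x, y, support_size=5))
```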


Variational Minimax Estimation of Discrete Distributions under KL Loss

We develop a family of upper and lower bounds on the worst-case expected KL loss for estimating a discrete distribution on a finite number m of points, given N i.i.d. samples. Our upper bounds are approximation-theoretic, similar to recent bounds for estimating discrete entropy; the lower bounds are Bayesian, based on averages of the KL loss under Dirichlet distributions. The upper bounds are con...
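
For intuition about this setting, the sketch below evaluates the expected KL loss of a simple add-constant (Dirichlet-smoothed) estimator by Monte Carlo. This is an assumed baseline for illustration, not one of the bounds from the paper; the names add_constant_estimate and expected_kl_loss and the chosen constants are hypothetical.

```python
import numpy as np

def add_constant_estimate(counts, beta):
    """Posterior-mean estimate under a symmetric Dirichlet(beta, ..., beta) prior."""
    return (counts + beta) / (counts.sum() + beta * len(counts))

def expected_kl_loss(p, n, beta, trials=2000, seed=0):
    """Monte Carlo estimate of the expected KL loss E[ D(p || p_hat) ]."""
    rng = np.random.default_rng(seed)
    support = p > 0
    losses = []
    for _ in range(trials):
        counts = rng.multinomial(n, p)
        p_hat = add_constant_estimate(counts, beta)
        losses.append(np.sum(p[support] * np.log(p[support] / p_hat[support])))
    return float(np.mean(losses))

# Compare two common smoothing constants on a skewed distribution over m = 10 points.
p = np.array([0.5] + [0.5 / 9] * 9)
for beta in (0.5, 1.0):   # Krichevsky-Trofimov vs. Laplace smoothing
    print(beta, expected_kl_loss(p, n=50, beta=beta))
```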


Minimax Optimal Additive Functional Estimation with Discrete Distribution: Slow Divergence Speed Case

This paper addresses an estimation problem of an additive functional of φ, which is defined as θ(P; φ) = ∑_{i=1}^{k} φ(p_i), given n i.i.d. random samples drawn from a discrete distribution P = (p_1, ..., p_k) with alphabet size k. We have revealed in the previous paper [1] that the minimax optimal rate of this problem is characterized by the divergence speed of the fourth derivative of φ in a range o...
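
To make the additive functional concrete, here is a minimal plug-in sketch of θ(P; φ) = ∑_{i=1}^{k} φ(p_i); choosing φ(p) = −p log p recovers the Shannon entropy as a special case. This is the naive plug-in baseline only, not the paper's minimax-optimal procedure, and the function names are illustrative assumptions.

```python
import numpy as np

def plugin_additive_functional(samples, alphabet_size, phi):
    """Plug-in estimate of theta(P; phi) = sum_{i=1}^{k} phi(p_i)."""
    p_hat = np.bincount(samples, minlength=alphabet_size) / len(samples)
    return float(np.sum(phi(p_hat)))

def phi_entropy(p):
    """phi(p) = -p log p with the convention phi(0) = 0 (gives Shannon entropy)."""
    return np.where(p > 0, -p * np.log(np.maximum(p, 1e-300)), 0.0)

rng = np.random.default_rng(1)
P = np.array([0.4, 0.3, 0.2, 0.1])
samples = rng.choice(4, size=5_000, p=P)
print(plugin_additive_functional(samples, alphabet_size=4, phi=phi_entropy))
# true value theta(P; phi) = H(P), about 1.28 nats
```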


Minimax Optimal Procedures for Locally Private Estimation

Working under a model of privacy in which data remains private even from the statistician, we study the tradeoff between privacy guarantees and the risk of the resulting statistical estimators. We develop private versions of classical information-theoretic bounds, in particular those due to Le Cam, Fano, and Assouad. These inequalities allow for a precise characterization of s...
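
For context on the local-privacy model, the sketch below implements k-ary randomized response, a standard ε-locally-private mechanism for frequency estimation, together with the usual unbiasing step. This is an assumed textbook illustration, not the specific procedure analyzed in the paper; the function names and parameter choices are hypothetical.

```python
import numpy as np

def randomized_response(samples, k, eps, rng):
    """k-ary randomized response: keep the true symbol with prob e^eps / (e^eps + k - 1),
    otherwise report one of the other k - 1 symbols uniformly (eps-locally private)."""
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)
    keep = rng.random(len(samples)) < p_keep
    offset = rng.integers(1, k, size=len(samples))     # shift past the true symbol
    return np.where(keep, samples, (samples + offset) % k)

def debiased_frequencies(reports, k, eps):
    """Unbiased estimate of the true frequencies from the privatized reports."""
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)
    p_other = 1.0 / (np.exp(eps) + k - 1)
    obs = np.bincount(reports, minlength=k) / len(reports)
    return (obs - p_other) / (p_keep - p_other)

rng = np.random.default_rng(2)
P = np.array([0.5, 0.25, 0.15, 0.1])
x = rng.choice(4, size=50_000, p=P)
reports = randomized_response(x, k=4, eps=1.0, rng=rng)
print(debiased_frequencies(reports, k=4, eps=1.0))   # roughly recovers P
```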


Color Constancy Using KL-Divergence

Color is a useful feature for machine vision tasks. However, its effectiveness is often limited by the fact that the measured pixel values in a scene are influenced by both object surface reflectance properties and incident illumination. Color constancy algorithms attempt to compute color features which are invariant of the incident illumination by estimating the parameters of the global scene ...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2018

ISSN: 0018-9448, 1557-9654

DOI: 10.1109/tit.2018.2805844